Cost approximation algorithms with nonmonotone line searches for a general class of nonlinear programs

Author

  • Michael Patriksson
Abstract

When solving ill-conditioned nonlinear programs by descent algorithms, the descent requirement may force the step lengths to become very small, resulting in very poor performance. Recently, suggestions have been made to circumvent this problem, among them a class of approaches in which the objective value may be allowed to increase temporarily. Grippo et al. [GLL91] introduce nonmonotone line searches in the class of deflected gradient methods in unconstrained differentiable optimization; this technique allows longer steps (typically of unit length) to be taken, and is successfully applied to some ill-conditioned problems. This paper extends their nonmonotone approach and convergence results to the large class of cost approximation algorithms of Patriksson [Pat93b], and to optimization problems with both convex constraints and nondifferentiable objective functions.
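The nonmonotone acceptance rule of Grippo et al. can be sketched as follows: a trial step is accepted if it produces sufficient decrease relative to the *worst* of the last M objective values, rather than the current one, so the objective may rise temporarily. The sketch below is a minimal illustration of this idea (not the cost approximation framework of the paper itself); the function and parameter names, and the steepest-descent demo problem, are assumptions for illustration.

```python
import numpy as np

def nonmonotone_armijo(f, grad_f, x, d, history, M=10, gamma=1e-4, beta=0.5, alpha0=1.0):
    """GLL-style nonmonotone Armijo backtracking (illustrative sketch).

    A step length alpha is accepted when
        f(x + alpha*d) <= max of last M objective values + gamma*alpha*grad_f(x)'d,
    which permits temporary increases in f while retaining sufficient decrease
    against the recent worst value. d must be a descent direction.
    """
    f_ref = max(history[-M:])         # reference: worst of the last M iterates
    slope = grad_f(x) @ d             # directional derivative (negative for descent)
    alpha = alpha0
    while f(x + alpha * d) > f_ref + gamma * alpha * slope:
        alpha *= beta                 # backtrack
    return alpha

# Demo on an ill-conditioned quadratic f(x) = 0.5 x'Ax with condition number 100
A = np.diag([1.0, 100.0])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x

x = np.array([1.0, 1.0])
history = [f(x)]
for _ in range(50):
    d = -grad(x)                      # steepest descent direction
    alpha = nonmonotone_armijo(f, grad, x, d, history)
    x = x + alpha * d
    history.append(f(x))
```

Because the reference value is a running maximum over the last M iterates, early iterations behave like ordinary Armijo, while later iterations can accept longer (often unit) steps that a monotone rule would reject.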


Similar resources

A new class of Conjugate Gradient Methods with extended Nonmonotone Line Search

In this paper, we propose a new nonlinear conjugate gradient method for large-scale unconstrained optimization which possesses the following properties: (i) the sufficient descent condition −gₖᵀdₖ ≥ (7/8)‖gₖ‖² holds without any line searches; (ii) with exact line search, this method reduces to a nonlinear version of the Liu-Storey conjugate gradient scheme; (iii) under some assumptions, global converge...

Full text

On efficiency of nonmonotone Armijo-type line searches

Abstract Monotonicity and nonmonotonicity play a key role in studying the global convergence and the efficiency of iterative schemes employed in the field of nonlinear optimization, where globally convergent and computationally efficient schemes are explored. This paper addresses some features of descent schemes and the motivation behind nonmonotone strategies and investigates the efficiency of...

Full text

The Global Convergence of Self-Scaling BFGS Algorithm with Nonmonotone Line Search for Unconstrained Nonconvex Optimization Problems

The self-scaling quasi-Newton method solves an unconstrained optimization problem by scaling the Hessian approximation matrix before it is updated at each iteration to avoid the possible large eigenvalues in the Hessian approximation matrices of the objective function. It has been proved in the literature that this method has the global and superlinear convergence when the objective function is...

Full text


Nonmonotone derivative-free methods for nonlinear equations

In this paper we study nonmonotone globalization techniques, in connection with iterative derivative-free methods for solving a system of nonlinear equations in several variables. First we define and analyze a class of nonmonotone derivative-free line search techniques for unconstrained minimization of differentiable functions. Then we introduce a globalization scheme, which combines nonmonoton...

Full text



Publication date: 2011